
    New bounds for circulant Johnson-Lindenstrauss embeddings

    This paper analyzes circulant Johnson-Lindenstrauss (JL) embeddings which, as an important class of structured random JL embeddings, are formed by randomizing the column signs of a circulant matrix generated by a random vector. With the help of recent decoupling techniques and matrix-valued Bernstein inequalities, we obtain a new bound $k = O(\epsilon^{-2}\log^{(1+\delta)}(n))$ for Gaussian circulant JL embeddings. Moreover, by using the Laplace transform technique (also called Bernstein's trick), we extend the result to the subgaussian case. The bounds in this paper offer a small improvement over the current best bounds for Gaussian circulant JL embeddings for certain parameter regimes and are derived using more direct methods.
    Comment: 11 pages; accepted by Communications in Mathematical Sciences
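    For readers unfamiliar with the construction described in the abstract, the following is a minimal numerical sketch (our own illustration, not the paper's code; the function name and FFT-based implementation are assumptions) of a Gaussian circulant JL embedding with randomized column signs:

    ```python
    import numpy as np

    def circulant_jl_embed(x, k, rng=None):
        # Hypothetical sketch: project x in R^n down to R^k with a Gaussian
        # circulant JL embedding whose column signs are randomized.
        rng = np.random.default_rng() if rng is None else rng
        n = x.size
        g = rng.standard_normal(n)             # Gaussian vector generating the circulant matrix
        eps = rng.choice([-1.0, 1.0], size=n)  # random column signs (Rademacher)
        # Row i of the circulant matrix C is g cyclically shifted by i, so
        # C @ (eps * x) is the circular convolution of g and eps * x,
        # computable in O(n log n) via the FFT.
        y = np.fft.ifft(np.fft.fft(g) * np.fft.fft(eps * x)).real
        return y[:k] / np.sqrt(k)              # keep the first k rows and normalize
    ```

    With k on the order of $\epsilon^{-2}\log^{(1+\delta)}(n)$, the embedded norm concentrates around the original norm within a $(1 \pm \epsilon)$ factor with high probability, which is the guarantee the paper's bound quantifies.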

    LRMM: Learning to Recommend with Missing Modalities

    Multimodal learning has shown promising performance in content-based recommendation due to auxiliary user and item information from multiple modalities such as text and images. However, the problem of incomplete or missing modalities is rarely explored, and most existing methods fail to learn a recommendation model when modalities are missing or corrupted. In this paper, we propose LRMM, a novel framework that mitigates not only the problem of missing modalities but also, more generally, the cold-start problem of recommender systems. We propose modality dropout (m-drop) and a multimodal sequential autoencoder (m-auto) to learn multimodal representations for complementing and imputing missing modalities. Extensive experiments on real-world Amazon data show that LRMM achieves state-of-the-art performance on rating prediction tasks. More importantly, LRMM is more robust than previous methods in alleviating data sparsity and the cold-start problem.
    Comment: 11 pages, EMNLP 2018
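    To make the modality-dropout idea concrete, here is a hypothetical sketch of what m-drop could look like; the function name, dropout rate, and the "keep at least one modality" rule are our assumptions, not details from the paper:

    ```python
    import numpy as np

    def modality_dropout(modalities, p_drop=0.3, rng=None):
        # Hypothetical sketch of modality dropout (m-drop): each modality's
        # feature vector is independently zeroed out with probability p_drop
        # during training, forcing the model to handle missing inputs.
        # p_drop=0.3 and the "never drop everything" rule are assumptions.
        rng = np.random.default_rng() if rng is None else rng
        keep = {m: rng.random() >= p_drop for m in modalities}
        if not any(keep.values()):
            keep[rng.choice(list(modalities))] = True  # keep at least one modality
        return {m: v if keep[m] else np.zeros_like(v)
                for m, v in modalities.items()}

    # Usage: apply per training example before feeding the encoder, e.g.
    # batch = modality_dropout({"text": text_vec, "image": image_vec})
    ```

    Training on such perturbed inputs teaches the downstream autoencoder to impute the zeroed modality from the ones that remain, which is the role the abstract assigns to m-auto.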